# Mixture of Experts model
## FlexOlmo 7x7B 1T RT
- Author: allenai
- License: Apache-2.0
- Tags: Large Language Model, Transformers, English
- Stats: 226 · 2

FlexOlmo is a new type of large language model that supports a flexible paradigm for data collaboration, allowing data owners to contribute data without relinquishing control.
## Qwen3 30B A3B AWQ
- Author: cognitivecomputations
- License: Apache-2.0
- Tags: Large Language Model, Transformers
- Stats: 14.45k · 12

Qwen3-30B-A3B-AWQ is an AWQ-quantized version of the Qwen3-30B-A3B model, suitable for text generation tasks and supporting switching between thinking and non-thinking modes.
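The AWQ entry above refers to 4-bit weight quantization. As a toy illustration only (not the actual activation-aware AWQ algorithm, which selects scales from activation statistics), group-wise symmetric quantization can be sketched as follows; the group size of 128 is an assumption chosen for the example:

```python
import numpy as np

def quantize_dequantize(w, group_size=128, bits=4):
    """Toy group-wise symmetric quantization: each group of weights
    shares one float scale; values are rounded to signed integers,
    then dequantized back to floats."""
    qmax = 2 ** (bits - 1) - 1                     # 7 for 4-bit signed
    groups = w.reshape(-1, group_size)
    scale = np.abs(groups).max(axis=1, keepdims=True) / qmax
    scale[scale == 0] = 1.0                        # avoid divide-by-zero
    q = np.clip(np.round(groups / scale), -qmax - 1, qmax)
    return (q * scale).reshape(w.shape)

rng = np.random.default_rng(0)
weights = rng.normal(size=1024).astype(np.float32)
recon = quantize_dequantize(weights)
err = np.abs(weights - recon).mean()
print(f"mean abs reconstruction error: {err:.4f}")
```

The reconstruction error shrinks as the group size decreases or the bit width increases, which is the basic storage/accuracy trade-off behind quantized releases like the ones listed here.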
## SAINEMO Remix
- Author: Moraliane
- Tags: Large Language Model, Transformers
- Stats: 201 · 36

A hybrid model built from multiple 12B-parameter models, specializing in Russian and English role-play and text generation.
## Mixtral 8x22B V0.1 GGUF
- Author: bartowski
- License: Apache-2.0
- Tags: Large Language Model, Supports Multiple Languages
- Stats: 597 · 12

A quantized version of Mixtral-8x22B-v0.1, produced with llama.cpp and supporting multiple languages and quantization types.